14 research outputs found

    Feature selection for chemical sensor arrays using mutual information

    We address the problem of feature selection for classifying a diverse set of chemicals using an array of metal oxide sensors. Our aim is to evaluate a filter approach to feature selection with reference to previous work, which used a wrapper approach on the same data set and established the best features and upper bounds on classification performance. We selected feature sets that exhibit the maximal mutual information with the identity of the chemicals. The selected features closely match those found to perform well in the previous study, which used a wrapper approach to conduct an exhaustive search of all permitted feature combinations. By comparing the classification performance of support vector machines (using features selected by mutual information) with the performance observed in the previous study, we found that while our approach does not always give the maximum possible classification performance, it always selects features that achieve classification performance approaching the optimum obtained by exhaustive search. We performed further classification using the selected feature set with some common classifiers and found that, for the selected features, Bayesian networks gave the best performance. Finally, we compared the observed classification performances with the performance of classifiers using randomly selected features. We found that the selected features consistently outperformed randomly selected features for all tested classifiers. The mutual information filter approach is therefore a computationally efficient method for selecting near-optimal features for chemical sensor arrays.
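
    As a rough illustration of the filter idea described above, the sketch below ranks features by their estimated mutual information with the class label and compares an SVM trained on the top-ranked features against one trained on randomly chosen features. The synthetic data, the number of retained features, and the scikit-learn estimators are illustrative assumptions, not the original pipeline or data set.

```python
# Mutual-information filter for sensor-array features (illustrative sketch).
import numpy as np
from sklearn.feature_selection import mutual_info_classif
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 32))      # placeholder sensor features (assumption)
y = rng.integers(0, 5, size=200)    # placeholder chemical identities (assumption)

# Rank features by estimated mutual information with the class label.
mi = mutual_info_classif(X, y, random_state=0)
top_k = np.argsort(mi)[::-1][:8]    # keep the 8 most informative features

# Compare an SVM on the selected features against randomly chosen features.
score_mi = cross_val_score(SVC(), X[:, top_k], y, cv=5).mean()
random_k = rng.choice(X.shape[1], size=8, replace=False)
score_rand = cross_val_score(SVC(), X[:, random_k], y, cv=5).mean()
print(f"MI-selected: {score_mi:.3f}  random: {score_rand:.3f}")
```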

    Dynamical Principles of Emotion-Cognition Interaction: Mathematical Images of Mental Disorders

    The key contribution of this work is to introduce a mathematical framework to understand self-organized dynamics in the brain that can explain certain aspects of itinerant behavior. Specifically, we introduce a model based upon the coupling of generalized Lotka-Volterra systems. This coupling is based upon competition for common resources. The system can be regarded as a normal or canonical form for any distributed system that shows self-organized dynamics entailing winnerless competition. Crucially, we will show that some of the fundamental instabilities that arise in these coupled systems are remarkably similar to endogenous activity seen in the brain (using EEG and fMRI). Furthermore, by changing a small subset of the system's parameters we can produce bifurcations and changes in metastable sequential dynamics, which bear a remarkable similarity to pathological brain states seen in psychiatry. In what follows, we will consider the coupling of two macroscopic modes of brain activity, which, in a purely descriptive fashion, we will label as cognitive and emotional modes. Our aim is to examine the dynamical structures that emerge when coupling these two modes and relate them tentatively to brain activity in normal and non-normal states.
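
    A minimal sketch of the kind of model described above: two groups of modes governed by generalized Lotka-Volterra dynamics and coupled through competition for common resources. The dimensions, growth rates, and competition matrix below are arbitrary illustrative choices, not the parameter regimes analysed in the paper.

```python
# Two coupled generalized Lotka-Volterra subsystems (illustrative sketch).
import numpy as np
from scipy.integrate import solve_ivp

n = 5                                       # modes per subsystem ("cognitive" / "emotional")
rng = np.random.default_rng(1)
sigma = np.ones(2 * n)                      # intrinsic growth rates (assumption)
rho = rng.uniform(0.5, 1.5, size=(2 * n, 2 * n))   # competition for common resources
np.fill_diagonal(rho, 1.0)

def glv(t, x):
    # dx_i/dt = x_i * (sigma_i - sum_j rho_ij * x_j)
    return x * (sigma - rho @ x)

x0 = rng.uniform(0.01, 0.1, size=2 * n)     # small random initial activities
sol = solve_ivp(glv, (0.0, 200.0), x0, max_step=0.1)
print(sol.y.shape)                          # trajectories of all coupled modes
```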

    A Boolean Hebb rule for binary associative memory design

    No full text
    A binary associative memory design procedure that gives a Hopfield network with a symmetric binary weight matrix is introduced in this paper. The proposed method is based on introducing the memory vectors as maximal independent sets to an undirected graph, which is constructed by Boolean operations analogous to the conventional Hebb rule. The parameters of the resulting network are then determined via the adjacency matrix of this graph in order to find a maximal independent set whose characteristic vector is close to the given distorted vector. We show that the method provides attractiveness for each memory vector and avoids spurious memories whenever the set of given memory vectors satisfies certain compatibility conditions, which implicitly imply sparsity. The applicability of the design method is finally investigated by a quantitative analysis of the compatibility conditions.
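
    One way to picture the construction (an illustrative assumption, not necessarily the paper's exact Boolean rule): join two units by an edge only if they are never simultaneously active in any memory vector, so that the support of each memory becomes an independent set of the resulting graph.

```python
# Memories as independent sets of a graph built by a Boolean, Hebb-like rule
# (illustrative construction; the paper's exact rule may differ).
import numpy as np

memories = np.array([[1, 0, 0, 1, 0],
                     [0, 1, 0, 0, 1],
                     [0, 0, 1, 1, 0]], dtype=bool)   # toy sparse memory vectors

# Boolean co-activation: True where some memory activates both bits together.
coactive = np.zeros((memories.shape[1], memories.shape[1]), dtype=bool)
for m in memories:
    coactive |= np.outer(m, m)

# Undirected graph: edges only between bits that are never co-active,
# so the support of every memory is an independent set of this graph.
adjacency = ~coactive
np.fill_diagonal(adjacency, False)

for m in memories:
    idx = np.flatnonzero(m)
    assert not adjacency[np.ix_(idx, idx)].any()     # independent-set check
print(adjacency.astype(int))
```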

    A Boolean Hebb rule for binary associative memory design

    No full text
    We propose a binary associative memory design method to be applied to a class of dynamical neural networks. The method is based on introducing the memory vectors as maximal independent sets to an undirected graph and on designing a dynamical network in order to find a maximal independent set whose characteristic vector is close to the given distorted vector. We show that our method provides attractiveness for each memory vector and avoids the occurrence of spurious states whenever the set of given memory vectors satisfies certain compatibility conditions. We also analyze the application of this design method to the discrete Hopfield network.
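
    A complementary recall sketch under the same assumption: with inhibitory weights taken from the graph's adjacency matrix and a small positive bias, an asynchronous unipolar update settles on a maximal independent set near the initial (distorted) state. The weights, bias, and update schedule below are illustrative choices, not the paper's design.

```python
# Asynchronous recall toward a maximal independent set (illustrative sketch).
import numpy as np

def recall(adjacency, distorted, sweeps=20, seed=0):
    rng = np.random.default_rng(seed)
    w = -adjacency.astype(float)              # inhibition between adjacent nodes (assumption)
    x = distorted.astype(float)
    for _ in range(sweeps):
        for i in rng.permutation(len(x)):     # asynchronous, randomly ordered updates
            x[i] = 1.0 if w[i] @ x + 0.5 > 0 else 0.0   # small positive bias
    return x.astype(int)

# Toy graph: a path 0-1-2-3-4; its maximal independent sets are the fixed points.
adjacency = np.zeros((5, 5), dtype=int)
for i, j in [(0, 1), (1, 2), (2, 3), (3, 4)]:
    adjacency[i, j] = adjacency[j, i] = 1

distorted = np.array([1, 1, 0, 0, 1])         # corrupted version of the set {0, 2, 4}
print(recall(adjacency, distorted))           # settles on a maximal independent set
```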

    A new design method for the complex-valued multistate Hopfield associative memory

    No full text
    A method to store each element of an integral memory set M ⊂ {1, 2, ..., K}^n as a fixed point of a complex-valued multistate Hopfield network is introduced. The method employs a set of inequalities to render each memory pattern a strict local minimum of a quadratic energy landscape. Based on the solution of this system, it gives a recurrent network of n multistate neurons with complex, symmetric synaptic weights, which operates on the finite state space {1, 2, ..., K}^n to minimize this quadratic functional. The maximum number of integral vectors that can be embedded into the energy landscape of the network by this method is investigated by computer experiments. The paper also examines the performance of the proposed method in reconstructing noisy gray-scale images.
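
    The sketch below illustrates only the recall side of such a network: each neuron holds one of K phases (a K-th root of unity) and an update quantizes the phase of its weighted input back onto that alphabet. The random Hermitian weight matrix is a placeholder assumption; the paper's inequality-based weight construction is not reproduced here.

```python
# Recall step of a complex-valued multistate Hopfield network (illustrative sketch).
import numpy as np

K = 8                                           # states per neuron
phases = np.exp(2j * np.pi * np.arange(K) / K)  # the K-state alphabet (roots of unity)

def csign(z):
    """Quantize a complex number onto the nearest of the K unit phases."""
    k = int(np.round(np.angle(z) / (2 * np.pi / K))) % K
    return phases[k]

rng = np.random.default_rng(0)
n = 6
w = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
w = (w + w.conj().T) / 2                        # Hermitian synaptic weights (placeholder)
np.fill_diagonal(w, 0)

state = phases[rng.integers(0, K, size=n)]      # e.g. a noisy multi-level pattern
for _ in range(10):                             # asynchronous updates until settled
    for i in range(n):
        state[i] = csign(w[i] @ state)
print(np.round(np.angle(state) / (2 * np.pi / K)) % K)   # state index of each neuron
```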

    Construction of energy landscape for discrete Hopfield associative memory with guaranteed error correction capability

    No full text
    An energy function-based auto-associative memory design method to store a given set of unipolar binary memory vectors as attractive fixed points of an asynchronous discrete Hopfield network is presented. The discrete quadratic energy function whose local minima correspond to the attractive fixed points of the network is constructed by solving a system of linear inequalities derived from the strict local minimality conditions. In spite of its computational complexity, the method performs better than conventional design methods, ensuring attractiveness for almost all memory sets whose cardinality is less than or equal to the dimension of their elements, as verified by computer simulations.
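
    The following sketch illustrates the inequality-based idea in a simplified form: for each memory vector and each single-bit flip, require the energy to increase by at least a margin, and solve the resulting linear feasibility problem for the weights and thresholds. The energy form E(x) = -x'Wx/2 + t'x, the margin, and the use of scipy's linear programming routine are assumptions for illustration, not the paper's exact formulation.

```python
# Design of a quadratic energy landscape by linear inequalities (illustrative sketch).
import numpy as np
from itertools import combinations
from scipy.optimize import linprog

def design(memories, margin=1.0, bound=10.0):
    memories = np.asarray(memories, dtype=float)
    n = memories.shape[1]
    pairs = list(combinations(range(n), 2))      # unknown symmetric weights w_ij, i < j
    n_vars = len(pairs) + n                      # plus n thresholds t_i
    rows, rhs = [], []
    for x in memories:
        for i in range(n):
            d = 1.0 - 2.0 * x[i]                 # +1 if bit i turns on, -1 if it turns off
            row = np.zeros(n_vars)
            for k, (a, b) in enumerate(pairs):   # energy change: d * (t_i - sum_j w_ij x_j)
                if a == i:
                    row[k] = d * x[b]
                elif b == i:
                    row[k] = d * x[a]
            row[len(pairs) + i] = -d
            rows.append(row)
            rhs.append(-margin)                  # require Delta E >= margin for every flip
    res = linprog(np.zeros(n_vars), A_ub=np.array(rows), b_ub=np.array(rhs),
                  bounds=(-bound, bound))
    if not res.success:
        return None                              # the inequality system is infeasible
    w = np.zeros((n, n))
    for k, (a, b) in enumerate(pairs):
        w[a, b] = w[b, a] = res.x[k]
    return w, res.x[len(pairs):]                 # symmetric weights and thresholds

print(design([[1, 0, 1, 0], [0, 1, 0, 1]]))
```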

    An energy function-based design method for discrete Hopfield associative memory with attractive fixed points

    No full text
    An energy function-based autoassociative memory design method to store a given set of unipolar binary memory vectors as attractive fixed points of an asynchronous discrete Hopfield network (DHN) is presented. The discrete quadratic energy function whose local minima correspond to the attractive fixed points of the network is constructed by solving a system of linear inequalities derived from the strict local minimality conditions. The weights and the thresholds are then calculated from this energy function. If the inequality system is infeasible, we conclude that no such asynchronous DHN exists, and extend the method to design a discrete piecewise quadratic energy function, which can be minimized by a generalized version of the conventional DHN, also proposed herein. In spite of its computational complexity, computer simulations indicate that the original method performs better than conventional design methods in the sense that the memory can store, and provide attractiveness for, almost all memory sets whose cardinality is less than or equal to the dimension of their elements. The overall method, together with its extension, guarantees the storage of an arbitrary collection of memory vectors that are at a Hamming distance of at least two from each other in the resulting network.
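
    To complete the picture, the following toy recall shows how an asynchronous discrete Hopfield update descends such a quadratic energy one bit at a time, so that a memory stored as a strict local minimum attracts nearby distorted vectors. The hand-picked weights and thresholds store the patterns 1010 and 0101 and are an illustrative assumption, not a network produced by the proposed procedure.

```python
# Asynchronous recall on a designed discrete Hopfield network (illustrative sketch).
import numpy as np

# Hand-picked toy parameters storing the memories 1010 and 0101 (assumption).
w = np.array([[ 0, -2,  2, -2],
              [-2,  0, -2,  2],
              [ 2, -2,  0, -2],
              [-2,  2, -2,  0]], dtype=float)
t = -np.ones(4)

def recall(x, sweeps=5):
    x = np.array(x, dtype=float)
    for _ in range(sweeps):
        for i in range(len(x)):                         # asynchronous single-bit updates
            x[i] = 1.0 if w[i] @ x - t[i] > 0 else 0.0  # greedy descent of E(x) = -x'Wx/2 + t'x
    return x.astype(int)

print(recall([1, 1, 1, 0]))   # one bit away from 1 0 1 0 -> recovers [1 0 1 0]
```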